Learning from queries for maximum information gain in imperfectly learnable problems
Authors
Abstract
In supervised learning, learning from queries rather than from random examples can improve generalization performance significantly. We study the performance of query learning for problems where the student cannot learn the teacher perfectly, which occur frequently in practice. As a prototypical scenario of this kind, we consider a linear perceptron student learning a binary perceptron teacher. Two kinds of queries for maximum information gain, i.e., minimum entropy, are investigated: Minimum student space entropy (MSSE) queries, which are appropriate if the teacher space is unknown, and minimum teacher space entropy (MTSE) queries, which can be used if the teacher space is assumed to be known, but a student of a simpler form has deliberately been chosen. We find that for MSSE queries, the structure of the student space determines the efficacy of query learning, whereas MTSE queries lead to a higher generalization error than random examples, due to a lack of feedback about the progress of the student in the way queries are selected.
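To make the MSSE idea concrete, here is a minimal sketch for a Bayesian linear student with a Gaussian weight posterior; it illustrates the query-selection principle only, not the statistical-mechanics analysis of the paper. A noisy observation of an input x shrinks the posterior entropy by (1/2) log(1 + xᵀΣx/σ²), so the maximum-information-gain (minimum student space entropy) query of fixed length points along the top eigenvector of the posterior covariance Σ. The input dimension, noise variance, query budget, spherical constraint |x|² = N, and the rank-one Bayesian update below are all illustrative assumptions; the teacher is a binary (sign) perceptron as in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (not taken from the paper): dimension, noise level, query budget.
N = 20          # input dimension
SIGMA2 = 0.25   # assumed output-noise variance of the Gaussian linear-student model
T = 200         # number of training inputs

# Binary perceptron teacher: labels are sign(w_teacher . x).
w_teacher = rng.standard_normal(N)
w_teacher /= np.linalg.norm(w_teacher)

def generalization_error(w_student, w):
    # For isotropic inputs, P(sign mismatch) = angle(w_student, w) / pi.
    c = w_student @ w / (np.linalg.norm(w_student) * np.linalg.norm(w) + 1e-12)
    return np.arccos(np.clip(c, -1.0, 1.0)) / np.pi

def train(query_mode):
    # Gaussian posterior over the linear student's weights: mean mu, covariance Sigma.
    mu, Sigma = np.zeros(N), np.eye(N)
    for _ in range(T):
        if query_mode == "msse":
            # MSSE query: the weight-posterior entropy drops by 0.5*log(1 + x.Sigma.x / SIGMA2),
            # so at fixed length |x|^2 = N the best query is the top eigenvector of Sigma.
            _, eigvecs = np.linalg.eigh(Sigma)   # eigenvalues in ascending order
            x = eigvecs[:, -1] * np.sqrt(N)
        else:
            # Random example: isotropic Gaussian input (|x|^2 = N on average).
            x = rng.standard_normal(N)
        y = np.sign(w_teacher @ x)               # binary teacher output
        # Rank-one Bayesian linear-regression update of (mu, Sigma).
        k = Sigma @ x / (SIGMA2 + x @ Sigma @ x)
        mu = mu + k * (y - mu @ x)
        Sigma = Sigma - np.outer(k, Sigma @ x)
    return generalization_error(mu, w_teacher)

print("MSSE queries   :", train("msse"))
print("random examples:", train("random"))
```

The "random" branch provides the random-example baseline referred to in the abstract; MTSE queries, which would be constructed in the assumed binary teacher space rather than from the student's posterior, are not sketched here.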
Related papers
Minimum entropy queries for linear students learning nonlinear rules
We study the fundamental question of how query learning performs in imperfectly learnable problems, where the student can only learn to approximate the teacher. Considering as a prototypical scenario a linear perceptron student learning a general nonlinear perceptron teacher, we find that queries for minimum entropy in student space, i.e., maximum information gain, lead to the same improvement in gene...
Full text
Learning from queries for maximum information gain in imperfectly learnable problems
Peter
In supervised learning, learning from queries rather than from random examples can improve generalization performance significantly. We study the performance of query learning for problems where the student cannot learn the teacher perfectly, which occur frequently in practice. As a prototypical scenario of this kind, we consider a linear perceptron student learning a binary perceptron teacher....
Full text
Separating Quantum and Classical Learning
We consider a model of learning Boolean functions from quantum membership queries. This model was studied in [26], where it was shown that any class of Boolean functions which is information-theoretically learnable from polynomially many quantum membership queries is also information-theoretically learnable from polynomially many classical membership queries. In this paper we establish a strong...
Full text
Noise-tolerant learning, the parity problem, and the
We describe a slightly sub-exponential time algorithm for learning parity functions in the presence of random classification noise. By applying this algorithm to the restricted case of parity functions that depend on only the first O(log n log log n) bits of input, we achieve the first known instance of a polynomial-time noise-tolerant learning algorithm for a concept class that is provably not learn...
Full text
Equivalences and Separations Between Quantum and Classical Learnability
We consider quantum versions of two well-studied models of learning Boolean functions: Angluin’s model of exact learning from membership queries and Valiant’s Probably Approximately Correct (PAC) model of learning from random examples. For each of these two learning models we establish a polynomial relationship between the number of quantum versus classical queries required for learning. These ...
Full text